On tensor products of matrix factorizations
Authors
Abstract
Let K be a field, and let f∈K[[x1,...,xr]] and g∈K[[y1,...,ys]] be nonzero elements. If X (resp. Y) is a matrix factorization of f (resp. g), Yoshino constructed a tensor product (of matrix factorizations) ⊗ˆ such that X⊗ˆY is a matrix factorization of f+g∈K[[x1,...,xr,y1,...,ys]]. In this paper, we propose a bifunctorial operation ⊗˜ and its variant ⊗˜′ such that X⊗˜Y and X⊗˜′Y are two different matrix factorizations of fg∈K[[x1,...,xr,y1,...,ys]]. We call ⊗˜ the multiplicative tensor product of X and Y. Several of its properties are proved. Moreover, we find three functorial variants of Yoshino's ⊗ˆ. Then ⊗˜ (or its variant) is used in conjunction with ⊗ˆ (or any of its variants) to give an improved version of the standard algorithm for factoring polynomials using matrices on the class of summand-reducible polynomials defined in this paper. Our algorithm produces matrix factors whose size is at most one half of the size one obtains with the standard method.
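For orientation, a matrix factorization of f of size n is a pair X = (φ, ψ) of n×n matrices over K[[x1,...,xr]] with φψ = ψφ = f·In. The display below is one standard presentation of Yoshino's tensor product ⊗ˆ; sign conventions vary across the literature, and the new operations ⊗˜ and ⊗˜′ are defined only in the paper itself, so this is an illustrative sketch rather than a quotation from the article.

\[
X \mathbin{\widehat{\otimes}} Y \;=\;
\left(
\begin{pmatrix}
  \varphi \otimes 1_m & 1_n \otimes \alpha \\
  -\,1_n \otimes \beta & \psi \otimes 1_m
\end{pmatrix},\;
\begin{pmatrix}
  \psi \otimes 1_m & -\,1_n \otimes \alpha \\
  1_n \otimes \beta & \varphi \otimes 1_m
\end{pmatrix}
\right),
\]

where X = (φ, ψ) has size n and Y = (α, β) has size m. A direct block computation shows that the product of the two displayed matrices, in either order, has φψ⊗1m + 1n⊗αβ = (f+g)·Inm on the diagonal blocks and 0 off the diagonal, so X⊗ˆY is a matrix factorization of f+g of size 2nm.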
Similar resources
Matrix Factorizations, Minimal Models and Massey Products
We present a method to compute the full non-linear deformations of matrix factorizations for ADE minimal models. This method is based on the calculation of higher products in the cohomology, called Massey products. The algorithm yields a polynomial ring whose vanishing relations encode the obstructions of the deformations of the D-branes characterized by these matrix factorizations. This coinci...
Orlov’s Equivalence and Tensor Products: from Sheaves to Matrix Factorizations and Back
A special case of a theorem due to Orlov states that for a hypersurface X ⊂ Pn−1 of degree n given by the equation W = 0, there exists an equivalence between the bounded derived category Db(coh X) of coherent sheaves on X and the homotopy category HMF(W) of graded matrix factorizations. We first give a description of this result, and present some methods for doing calculations with it. In the la...
On Kronecker products, tensor products and matrix differential calculus
The algebra of the Kronecker products of matrices is recapitulated using a notation that reveals the tensor structures of the matrices. It is claimed that many of the difficulties that are encountered in working with the algebra can be alleviated by paying close attention to the indices that are concealed beneath the conventional matrix notation. The vectorisation operations and the commutation...
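Two well-known identities underlying the index bookkeeping that this abstract refers to, included here only as a hedged illustration and not taken from the abstract: for A of size m×n, X of size n×p and B of size p×q,

\[
\operatorname{vec}(AXB) \;=\; \bigl(B^{\mathsf{T}} \otimes A\bigr)\,\operatorname{vec}(X),
\qquad
K_{mn}\,\operatorname{vec}(A) \;=\; \operatorname{vec}\bigl(A^{\mathsf{T}}\bigr),
\]

where K_{mn} denotes the mn×mn commutation matrix associated with an m×n matrix A.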
Turbo-SMT: Parallel coupled sparse matrix-Tensor factorizations and applications
How can we correlate the neural activity in the human brain as it responds to typed words, with properties of these terms (like 'edible', 'fits in hand')? In short, we want to find latent variables, that jointly explain both the brain activity, as well as the behavioral responses. This is one of many settings of the Coupled Matrix-Tensor Factorization (CMTF) problem. Can we enhance any CMTF sol...
Turbo-SMT: Accelerating Coupled Sparse Matrix-Tensor Factorizations by 200x
How can we correlate the neural activity in the human brain as it responds to typed words, with properties of these terms (like 'edible', 'fits in hand')? In short, we want to find latent variables, that jointly explain both the brain activity, as well as the behavioral responses. This is one of many settings of the Coupled Matrix-Tensor Factorization (CMTF) problem. Can we accelerate any CMTF s...
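Both Turbo-SMT entries above address the coupled matrix-tensor factorization (CMTF) problem. A minimal sketch of the standard CMTF objective, written here only for context and not quoted from either abstract: with a third-order data tensor 𝒳 (e.g. brain activity) and a side-information matrix Y (e.g. word properties) sharing their first mode,

\[
\min_{A,B,C,D}\;
\Bigl\lVert \mathcal{X} \;-\; \sum_{r=1}^{R} a_r \circ b_r \circ c_r \Bigr\rVert_F^2
\;+\;
\bigl\lVert Y - A D^{\mathsf{T}} \bigr\rVert_F^2,
\]

where A = [a_1, ..., a_R] is the factor matrix shared by the CP decomposition of 𝒳 and the low-rank factorization of Y. Sparsity-promoting penalties on the factors are typically added on top of this basic objective.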
Journal
Journal title: Journal of Algebra
Year: 2022
ISSN: 1090-266X, 0021-8693
DOI: https://doi.org/10.1016/j.jalgebra.2022.05.034